3 April 2020

Why we need “informational distancing” during the coronavirus crisis

Stepping back and taking time to consider the media we consume could help to slow the spread of misinformation about the pandemic. 

By Nina Jankowicz

The World Health Organisation recently launched a WhatsApp chatbot to inform the world about coronavirus. It links users to reports on the latest case count, explains proper hygiene techniques, and debunks the mounting myths about the pandemic. “Hand dryers are NOT effective in killing the coronavirus,” it admonishes. “Ultraviolet light SHOULD NOT be used for sterilisation and can cause skin irritation.” The disinformation extends beyond would-be cures; false text messages circulate warning of the imminent imposition of martial law in New York or London. Politicians both foreign and domestic attempt to assign blame for the virus, which they variously claim was created in a lab, released to disrupt global trade, or unleashed to undermine elections.

We’ve seen unprecedented cooperation and action among social media platforms, governments and international organisations to combat the spread of malign information. And yet it proliferates. We cannot fact-check our way out of this crisis, because in the onslaught of Covid-19 disinformation the most important weapon is not fakes but fear. And unlike toilet paper and pasta, there’s no shortage of it these days. Disinformation thrives on emotional manipulation, and just as social distancing slows the pandemic, the world needs informational distancing to slow the coronavirus infodemic.

There is a widespread misconception – driven, in part, by President Trump’s insistence on using the term “fake news” – that disinformation consists of entirely fabricated, outlandish stories. A 2019 Pew Research Center study found that half of Americans believe “made-up news and information” is a very big problem in the country today, ranking it above violent crime and climate change. To combat the problem, 78 per cent of those surveyed said that they “check the facts in news stories themselves”.

But disinformation is never simply about fact vs fiction. The key to its success is not the quality of the yarn disinformers spin, the believability of the fake personae they use to amplify it, or the effectiveness of the ads used to boost their posts; it is human nature. We are being emotionally manipulated. 

In 2016, Russia’s disinformation campaign in the United States succeeded in influencing political discourse, reaching over 30 million Americans and generating over 78 million interactions. Contrary to popular belief, this success was not down to the roughly $100,000 Russia reportedly spent on Facebook ads. An Oxford Internet Institute study found that it was organic engagement, not advertising, that drove the bulk of the Russian Internet Research Agency’s (IRA) campaign. The agency was able to garner such high engagement because it had spent years cultivating communities around emotionally charged, politically divisive issues such as police brutality, immigration, racism, and gay rights, as the ads and posts released by Congress revealed.

Much of the content shared by the IRA early in the 2016 presidential campaign was not fake or even outwardly malicious; it consisted of positive posts designed to engender camaraderie among users around a particular issue. As the election neared, the IRA made bigger and bigger asks of its communities, using punchy language that played on the target group’s emotions. For example, on the “Being Patriotic” Facebook page, alleged by many to have been run by the IRA, one post on the removal of a Confederate monument appealed to homophobes fearful of losing their “history” and place in American society. “The USA is out of balance now,” it read. “The gays, lesbians, and trans-sexuals [sic] have more rights than other folks.” To many in the group, the post would have read as visceral emotional truth.

By 2016, Russia was well practised at emotional manipulation, having deployed it across Europe for more than a decade: capitalising on feelings of disenfranchisement among ethnic Russians in Estonia, on corruption in Ukraine since the Maidan revolution, and on worries about migration across Western Europe. But never has it had such a fertile field in which to plant seeds of distrust and untruth as the coronavirus crisis. Everyone is fearful. We are afraid we’ll get sick, or that our loved ones will. Traffic to social media platforms like Facebook has ballooned as we seek human connection and any scrap of information we can knit together into a quilt of confidence and comfort. It’s no surprise to see an explosion in disinformation about the pandemic; the return on investment is simply too high for disinformers to pass up. That’s why China is attempting to obscure the fact that the virus originated in Wuhan, why Iran has tried to cover up how bad its domestic situation is, and why Russia is stepping in once again to amplify panic and distrust: CNN reports that since late January, the EU’s External Action Service has recorded almost 80 cases of coronavirus disinformation linked to pro-Kremlin media.


Governments are not the only combatants in the coronavirus information war. In addition to domestic hucksters peddling miracle cures, the US president has got involved, and his emotional manipulation megaphone is louder and more effective than that of any of our adversaries. President Trump’s insistence on calling Covid-19 “the Chinese virus” appeals to his base’s prejudices and bolsters his rhetoric in his ongoing trade war with Beijing. As he admitted when he called in to Fox & Friends this week, the phrase – which is endangering Asian Americans – is his own calculated gambit in the disinformation game.

Fear has turned off our informational filters, and we cannot “wash away” the effects of emotional manipulation while we are in crisis mode, as “informational hygiene” techniques suggest. But we can put space between ourselves and the manipulators. We should take the same precautions with our informational health as with our physical health, recognising that disinformation runs on our feelings. In short, we must start practising informational distancing. Rather than allowing ourselves to be played by bad actors, when we feel emotion rising we should step away from the screen and take a deep breath (or ten). By putting actual physical distance between ourselves and our devices, we not only allow our emotions time to cool; we interrupt the algorithmic tyranny on which disinformation runs – the systems that keep us scrolling through never-ending news feeds and down recommendation rabbit holes. Most importantly, we stop amplifying disinformation by sharing it.

If, after some distance, we still find ourselves pondering those “ten weird tips to beat Covid-19”, we should verify the information through a reliable, non-partisan, professional source. In the case of coronavirus, that means doctors and scientists, not politicians and pundits.

Informational distancing takes work, and it addresses only a small part of the information ecosystem that produced the coronavirus infodemic in the first place. To truly flatten the informational curve, we need traditional and social media alike to cover and amplify information responsibly. Most critically, we need politicians to recognise that disinformation is not a technique they should employ, in or out of crisis.

The ability to debunk a “fake” won’t save us from this or any future infodemic, but an awareness of the human foibles on which disinformation feeds might stem its spread. 

Nina Jankowicz is the Disinformation Fellow at the Wilson Center and the author of the forthcoming How to Lose the Information War (Bloomsbury/I.B. Tauris, July 2020).
